Learning non-parametric Markov networks with mutual information

Authors

  • Janne Leppä-aho
  • Santeri Räisänen
  • Xiao Yang
  • Teemu Roos
Abstract

We propose a method for learning Markov network structures for continuous data without invoking any assumptions about the distribution of the variables. The method builds on previous work on a non-parametric estimator of mutual information, which is used to construct a non-parametric test for multivariate conditional independence. This independence test is then combined with an efficient constraint-based algorithm for learning the graph structure. The performance of the method is evaluated on several synthetic data sets, and it is shown to learn considerably more accurate structures than competing methods when the dependencies between the variables involve non-linearities.
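The pipeline the abstract describes — a non-parametric mutual-information estimator turned into an independence test and fed to a constraint-based learner — can be sketched with a standard kNN (Kraskov–Stögbauer–Grassberger) MI estimator and a permutation null. This is an illustrative reconstruction, not the paper's exact estimator or test; the function names and parameters are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """kNN (Kraskov-Stoegbauer-Grassberger) estimate of I(X; Y) in nats.

    A standard non-parametric MI estimator, used here for illustration;
    the paper's exact estimator and test statistic may differ.
    """
    x, y = x.reshape(len(x), -1), y.reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    # Distance to the k-th nearest neighbour in the joint space (max-norm);
    # k+1 because the query point itself is returned at distance 0.
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    tx, ty = cKDTree(x), cKDTree(y)
    # Count marginal neighbours strictly closer than eps (self excluded).
    nx = [len(tx.query_ball_point(pt, np.nextafter(r, 0), p=np.inf)) - 1
          for pt, r in zip(x, eps)]
    ny = [len(ty.query_ball_point(pt, np.nextafter(r, 0), p=np.inf)) - 1
          for pt, r in zip(y, eps)]
    return digamma(k) + digamma(n) - np.mean(
        digamma(np.array(nx) + 1) + digamma(np.array(ny) + 1))

def independence_test(x, y, n_perm=40, k=3, seed=0):
    """Permutation test of X independent-of Y based on the MI estimate."""
    rng = np.random.default_rng(seed)
    observed = ksg_mi(x, y, k)
    null = [ksg_mi(x, rng.permutation(y), k) for _ in range(n_perm)]
    p_value = (1 + sum(v >= observed for v in null)) / (1 + n_perm)
    return observed, p_value

# A non-linear, uncorrelated dependence (y = x^2 + noise): a Gaussian /
# partial-correlation test would miss it, but the MI test detects it.
rng = np.random.default_rng(0)
x = rng.normal(size=(300, 1))
mi_dep, p_dep = independence_test(x, x ** 2 + 0.1 * rng.normal(size=(300, 1)))
mi_ind, p_ind = independence_test(x, rng.normal(size=(300, 1)))
print(f"dependent:   MI={mi_dep:.2f}, p={p_dep:.3f}")
print(f"independent: MI={mi_ind:.2f}, p={p_ind:.3f}")
```

In a full constraint-based learner, the same statistic would be evaluated conditionally (e.g. via the chain rule I(X;Y|Z) = I(X;(Y,Z)) - I(X;Z)) to decide which edges to remove from the graph.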


Similar resources

Feature Extraction by Non-Parametric Mutual Information Maximization

We present a method for learning discriminative feature transforms using as criterion the mutual information between class labels and transformed features. Instead of a commonly used mutual information measure based on Kullback-Leibler divergence, we use a quadratic divergence measure, which allows us to make an efficient non-parametric implementation and requires no prior assumptions about cla...


Markov chain order estimation with parametric significance tests of conditional mutual information

Despite the different approaches suggested in the literature, accurate estimation of the order of a Markov chain from a given symbol sequence is an open issue, especially when the order is moderately large. Here, parametric significance tests of the conditional mutual information (CMI) of order m, Ic(m), are conducted on a symbol sequence for increasing orders m in order to estimate the ...

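The CMI statistic described above can be sketched with a plug-in estimate from block counts, together with the usual parametric approximation that 2n·Ic(m) is chi-square distributed under the independence hypothesis. This is a minimal sketch of the general idea, assuming a plug-in estimator; the cited paper's exact test details may differ.

```python
from collections import Counter
from math import log
import random

from scipy.stats import chi2

def cmi_order(seq, m):
    """Plug-in estimate of Ic(m) = I(x_t ; x_{t-m} | x_{t-m+1..t-1}) in nats."""
    blocks = [tuple(seq[i:i + m + 1]) for i in range(len(seq) - m)]
    n = len(blocks)
    c_all = Counter(blocks)                    # (past, middle, future)
    c_mid = Counter(b[1:-1] for b in blocks)   # middle only
    c_fut = Counter(b[1:] for b in blocks)     # middle + future
    c_pst = Counter(b[:-1] for b in blocks)    # past + middle
    return sum(c / n * log(c * c_mid[b[1:-1]] / (c_fut[b[1:]] * c_pst[b[:-1]]))
               for b, c in c_all.items())

def cmi_pvalue(seq, m, alphabet_size):
    """Parametric significance test: under H0 the chain has order < m,
    so 2*n*Ic(m) is approximately chi-square with
    |A|^(m-1) * (|A|-1)^2 degrees of freedom."""
    n = len(seq) - m
    df = alphabet_size ** (m - 1) * (alphabet_size - 1) ** 2
    return chi2.sf(2 * n * cmi_order(seq, m), df)

# Order-1 binary Markov chain with persistence 0.9: Ic(1) is clearly
# positive (its true value is ln 2 - H(0.9) ~ 0.37 nats), while Ic(2)
# is near zero, so the estimated order is 1.
random.seed(0)
seq = [0]
for _ in range(5000):
    seq.append(seq[-1] if random.random() < 0.9 else 1 - seq[-1])
print(f"Ic(1) = {cmi_order(seq, 1):.4f}, p = {cmi_pvalue(seq, 1, 2):.3g}")
print(f"Ic(2) = {cmi_order(seq, 2):.4f}, p = {cmi_pvalue(seq, 2, 2):.3g}")
```

The estimated order is the smallest m for which the test of Ic(m) stops rejecting the null of conditional independence.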

A New Approach to Hybrid HMM/ANN Speech Recognition using Mutual Information Neural Networks

This paper presents a new approach to speech recognition with hybrid HMM/ANN technology. While the standard approach to hybrid HMM/ANN systems is based on the use of neural networks as posterior probability estimators, the new approach is based on the use of mutual information neural networks trained with a special learning algorithm in order to maximize the mutual information between the input...



Mutual information minimization: application to Blind Source Separation

In this paper, the problem of Blind Source Separation (BSS) through mutual information minimization is addressed. For mutual information minimization, multivariate score functions are first introduced, which can serve to construct a non-parametric "gradient" for mutual information. Then, two general gradient-based approaches for minimizing mutual information in a parametric model are prese...



Journal:
  • CoRR

Volume: abs/1708.02497

Pages: –

Publication date: 2017